
    Regression Depth and Center Points

    We show that, for any set of n points in d dimensions, there exists a hyperplane with regression depth at least ceiling(n/(d+1)), as had been conjectured by Rousseeuw and Hubert. Dually, for any arrangement of n hyperplanes in d dimensions there exists a point that cannot escape to infinity without crossing at least ceiling(n/(d+1)) hyperplanes. We also apply our approach to related questions on the existence of partitions of the data into subsets such that a common plane has nonzero regression depth in each subset, and to the computational complexity of regression depth problems. Comment: 14 pages, 3 figures
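
    To make the quantity in the bound concrete, the sketch below is a purely illustrative brute-force depth computation, not the authors' construction: it assumes the standard removal-based definition of regression depth for simple regression of points in the plane (d = 2 in the abstract's notation, where the guaranteed depth is ceiling(n/3)), and the function name and example points are made up for this listing.

        def regression_depth(a, b, points):
            """Naive regression depth of the line y = a*x + b for 2-D points.

            Depth = the minimum number of points that must be removed before the
            line becomes a 'nonfit', i.e. can be rotated to a vertical position
            without passing through any remaining data point.  O(n^2) candidate
            sweep; illustrative only, not an efficient algorithm.
            """
            resid = [(x, y - (a * x + b)) for (x, y) in points]
            xs = sorted({x for x, _ in resid})
            # candidate tilting positions: below all points, at, and between x-values
            candidates = [xs[0] - 1.0] + xs + [(u + v) / 2.0 for u, v in zip(xs, xs[1:])]
            best = len(points)
            for v in candidates:
                lp = sum(1 for x, r in resid if x <= v and r >= 0)  # left, residual >= 0
                lm = sum(1 for x, r in resid if x <= v and r <= 0)  # left, residual <= 0
                rp = sum(1 for x, r in resid if x > v and r >= 0)   # right, residual >= 0
                rm = sum(1 for x, r in resid if x > v and r <= 0)   # right, residual <= 0
                best = min(best, lp + rm, lm + rp)
            return best

        pts = [(0, 0), (1, 1), (2, 0), (3, 3), (4, 2), (5, 5)]
        print(regression_depth(1.0, 0.0, pts))  # depth of the line y = x for these 6 points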

    Violator Spaces: Structure and Algorithms

    Sharir and Welzl introduced an abstract framework for optimization problems, called LP-type problems (also known as generalized linear programming problems), which proved useful in algorithm design. We define a new and, we believe, simpler and more natural framework: violator spaces, which constitute a proper generalization of LP-type problems. We show that Clarkson's randomized algorithms for low-dimensional linear programming work in the context of violator spaces. For example, in this way we obtain the fastest known algorithm for the P-matrix generalized linear complementarity problem with a constant number of blocks. We also give two new characterizations of LP-type problems: they are equivalent to acyclic violator spaces, as well as to concrete LP-type problems (informally, the constraints in a concrete LP-type problem are subsets of a linearly ordered ground set, and the value of a set of constraints is the minimum of its intersection). Comment: 28 pages, 5 figures, extended abstract was presented at ESA 2006; author spelling fixed
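
    A hedged toy illustration of the framework (not taken from the paper, and not Clarkson's actual algorithm): a violator space pairs a finite ground set of constraints with a mapping that assigns to every subset the constraints it violates. The example below assumes the smallest-enclosing-interval problem on a set of numbers, whose violator mapping and tiny combinatorial dimension (2) make the randomized style of these solvers easy to see; all names are hypothetical.

        import random

        # Toy violator space (illustrative assumption): the ground set H is a finite
        # set of reals, the "value" of a subset G is its smallest enclosing interval
        # [min G, max G], and the violators of G are the elements of H outside it.

        def violators(G, H):
            """V(G): constraints of H violated by the smallest interval enclosing G."""
            if not G:
                return set(H)                      # the empty set is violated by everything
            lo, hi = min(G), max(G)
            return {h for h in H if h < lo or h > hi}

        def solve(H, seed=0):
            """Naive randomized loop in the spirit of LP-type / violator-space solvers
            (not Clarkson's algorithm): while some constraint is violated, add a random
            violator; here a basis has at most 2 elements, the two extreme points."""
            rng = random.Random(seed)
            working = set()
            while True:
                V = violators(working, H)
                if not V:
                    return {min(working), max(working)} if working else set()
                working.add(rng.choice(sorted(V)))

        print(solve([3.2, -1.0, 7.5, 0.4, 2.2]))   # {-1.0, 7.5}: the basis of the enclosing interval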

    A Multi-signal Variant for the GPU-based Parallelization of Growing Self-Organizing Networks

    Among the many possible approaches to parallelizing self-organizing networks, and growing self-organizing networks in particular, perhaps the most common one is to produce an optimized, parallel implementation of the standard sequential algorithms reported in the literature. In this paper we explore an alternative approach, based on a new algorithm variant specifically designed to match the large-scale, fine-grained parallelism of GPUs, in which multiple input signals are processed at once. Comparative tests have been performed, using both parallel and sequential implementations of the new algorithm variant, in particular for a growing self-organizing network that reconstructs surfaces from point clouds. The experimental results show that this approach harnesses more effectively the intrinsic parallelism that self-organizing network algorithms intuitively seem to offer, obtaining better performance even with networks of smaller size. Comment: 17 pages
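
    The core idea of processing a batch of input signals in one step, so that the work maps onto GPU data parallelism, can be sketched as follows. This is a hedged, schematic NumPy illustration of batched winner search and adaptation only; the function name is made up, and the authors' actual variant additionally handles network growth, edges, and aging.

        import numpy as np

        def multi_signal_step(weights, signals, lr=0.1):
            """One schematic multi-signal adaptation step.

            weights : (m, d) array of network node positions
            signals : (k, d) array of k input signals processed at once

            Instead of the classic one-signal-at-a-time loop, all k best-matching
            units are found with a single distance-matrix computation (the kind of
            data-parallel work that maps well onto a GPU), and each winner is moved
            towards the mean of the signals it won.
            """
            # (k, m) matrix of squared distances between every signal and every node
            d2 = ((signals[:, None, :] - weights[None, :, :]) ** 2).sum(axis=-1)
            winners = d2.argmin(axis=1)                  # best-matching unit per signal
            new_weights = weights.copy()
            for w in np.unique(winners):
                batch = signals[winners == w]            # all signals claimed by node w
                new_weights[w] += lr * (batch.mean(axis=0) - weights[w])
            return new_weights, winners

        # toy usage: 5 nodes, 32 signals in 3-D
        rng = np.random.default_rng(0)
        W = rng.normal(size=(5, 3))
        S = rng.normal(size=(32, 3))
        W, win = multi_signal_step(W, S)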

    Construction and Analysis of Projected Deformed Products

    We introduce a deformed product construction for simple polytopes in terms of lower-triangular block matrix representations. We further show how Gale duality can be employed for the construction and for the analysis of deformed products such that specified faces (e.g. all the k-faces) are "strictly preserved" under projection. Thus, starting from an arbitrary neighborly simplicial (d-2)-polytope Q on n-1 vertices we construct a deformed n-cube, whose projection to the last d coordinates yields a neighborly cubical d-polytope. As an extension of the cubical case, we construct matrix representations of deformed products of (even) polygons (DPPs), which have a projection to d-space that retains the complete $(\lfloor d/2 \rfloor - 1)$-skeleton. In both cases the combinatorial structure of the images under projection is completely determined by the neighborly polytope Q: our analysis provides explicit combinatorial descriptions. This yields a multitude of combinatorially different neighborly cubical polytopes and DPPs. As a special case, we obtain simplified descriptions of the neighborly cubical polytopes of Joswig & Ziegler (2000) as well as of the "projected deformed products of polygons" that were announced by Ziegler (2004), a family of 4-polytopes whose "fatness" gets arbitrarily close to 9. Comment: 20 pages, 5 figures
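
    As a hedged schematic of the kind of representation the abstract refers to (not the paper's exact matrices), a deformed product of two simple polytopes P = {x : Ax <= a} and Q = {y : By <= b} can be written as a lower-triangular block inequality system, where the off-diagonal block C and the modified right-hand side b' are the assumed "deformation" data:

        % illustrative shape only; choosing C = 0 and b' = b recovers the ordinary product P x Q
        \begin{pmatrix} A & 0 \\ C & B \end{pmatrix}
        \begin{pmatrix} x \\ y \end{pmatrix}
        \le
        \begin{pmatrix} a \\ b' \end{pmatrix}

    Suitable choices of the deformation block and right-hand side are what allow specified faces of the product to survive the projection to the last coordinates.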

    Prodsimplicial-Neighborly Polytopes

    Simultaneously generalizing both neighborly and neighborly cubical polytopes, we introduce PSN polytopes: their k-skeleton is combinatorially equivalent to that of a product of r simplices. We construct PSN polytopes by three different methods, the most versatile of which is an extension of Sanyal and Ziegler's "projecting deformed products" construction to products of arbitrary simple polytopes. For general r and k, the lowest dimension we achieve is 2k+r+1. Using topological obstructions similar to those introduced by Sanyal to bound the number of vertices of Minkowski sums, we show that this dimension is minimal if we additionally require that the PSN polytope is obtained as a projection of a polytope that is combinatorially equivalent to the product of r simplices, when the dimensions of these simplices are all large compared to k. Comment: 28 pages, 9 figures; minor corrections

    Bounding Helly numbers via Betti numbers

    We show that very weak topological assumptions are enough to ensure the existence of a Helly-type theorem. More precisely, we show that for any non-negative integers $b$ and $d$ there exists an integer $h(b,d)$ such that the following holds. If $\mathcal F$ is a finite family of subsets of $\mathbb R^d$ such that $\tilde\beta_i\left(\bigcap\mathcal G\right) \le b$ for any $\mathcal G \subsetneq \mathcal F$ and every $0 \le i \le \lceil d/2 \rceil - 1$, then $\mathcal F$ has Helly number at most $h(b,d)$. Here $\tilde\beta_i$ denotes the reduced $\mathbb Z_2$-Betti numbers (with singular homology). These topological conditions are sharp: if any one of these first $\lceil d/2 \rceil$ Betti numbers is left uncontrolled, there are families with unbounded Helly number. Our proofs combine homological non-embeddability results with a Ramsey-based approach to build, given an arbitrary simplicial complex $K$, some well-behaved chain map $C_*(K) \to C_*(\mathbb R^d)$. Comment: 29 pages, 8 figures

    Artificial Intelligence Models in the Diagnosis of Adult-Onset Dementia Disorders: A Review

    Background: The progressive aging of populations, primarily in the industrialized western world, is accompanied by the increased incidence of several non-transmittable diseases, including neurodegenerative diseases and adult-onset dementia disorders. To stimulate adequate interventions, including treatment and preventive measures, an early, accurate diagnosis is necessary. Conventional magnetic resonance imaging (MRI) represents a technique quite common for the diagnosis of neurological disorders. Increasing evidence indicates that the association of artificial intelligence (AI) approaches with MRI is particularly useful for improving the diagnostic accuracy of different dementia types. Objectives: In this work, we have systematically reviewed the characteristics of AI algorithms in the early detection of adult-onset dementia disorders, and also discussed their performance metrics. Methods: A document search was conducted in three databases, namely PubMed (Medline), Web of Science, and Scopus. The search was limited to articles published after 2006 and written in English. The screening of the articles was performed using quality criteria based on the Newcastle-Ottawa Scale (NOS) rating. Only papers with an NOS score ≥ 7 were considered for further review. Results: The search produced 1876 articles; 1195 papers were excluded as duplicates. Multiple screening passes against the quality criteria yielded 29 studies. All the selected articles were further grouped based on different attributes, including study type, type of AI model used in the identification of dementia, performance metrics, and data type. Conclusions: The most common adult-onset dementia disorders were Alzheimer's disease and vascular dementia. AI techniques combined with MRI yielded diagnostic accuracies ranging from 73.3% to 99%. These findings suggest that AI should be associated with conventional MRI techniques to obtain a precise and early diagnosis of dementia disorders occurring in old age.

    Conflicting phylogenomic signals reveal a pattern of reticulate evolution in a recent high‐Andean diversification (Asteraceae: Astereae: Diplostephium)

    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/137610/1/nph14530_am.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/137610/2/nph14530.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/137610/3/nph14530-sup-0001-SupInfo.pd

    On the use of cartographic projections in visualizing phylogenetic tree space

    Phylogenetic analysis is becoming an increasingly important tool for biological research. Applications include epidemiological studies, drug development, and evolutionary analysis. Phylogenetic search is a known NP-hard problem. The size of the data sets that can be analyzed is limited by the exponential growth in the number of trees that must be considered as the problem size increases. A better understanding of the problem space could lead to better methods, which in turn could lead to the feasible analysis of more data sets. We present a definition of phylogenetic tree space and a visualization of this space that shows significant exploitable structure. This structure can be used to develop search methods capable of handling much larger data sets.
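
    To make the "exponential growth in the number of trees" concrete: the number of distinct unrooted binary topologies on n labelled taxa is (2n-5)!! = 3 * 5 * ... * (2n-5), a standard count that is not specific to this work. A few lines of Python (with a made-up function name) illustrate how quickly exhaustive search becomes infeasible:

        def num_unrooted_trees(n):
            """(2n - 5)!! = 3 * 5 * ... * (2n - 5): the number of distinct
            unrooted binary tree topologies on n labelled taxa (n >= 3)."""
            count = 1
            for k in range(3, 2 * n - 4, 2):
                count *= k
            return count

        for n in (10, 20, 50):
            print(n, num_unrooted_trees(n))
        # 10 taxa ->  2,027,025 topologies
        # 20 taxa ->  ~2.2e20 topologies
        # 50 taxa ->  ~2.8e74 topologies, far beyond exhaustive enumeration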

    An efficient and extensible approach for compressing phylogenetic trees

    Biologists require new algorithms to efficiently compress and store their large collections of phylogenetic trees. TreeZip is a novel method for compressing phylogenetic trees. Recently, we extended our TreeZip algorithm to support branch lengths and showed how it can be used to quickly extract sets of trees of interest. The key advantage of TreeZip over standard compression methods like 7zip is its ability to interpret and compress tree collections semantically, making it immune to branch rotations and allowing key operations (such as calculating a consensus tree) to be performed quickly and without a loss of space savings. On unweighted phylogenetic trees, TreeZip is able to compress Newick files in excess of 98%. On weighted phylogenetic trees, TreeZip is able to compress a Newick file by at least 73%. TreeZip can be combined with 7zip with little overhead, allowing space savings in excess of 99% (unweighted) and 92% (weighted). Unlike TreeZip, 7zip is not immune to branch rotations, and performs worse as the level of variability in the Newick string representation increases. Finally, since the TreeZip compressed text (TRZ) file contains all the semantic information in a collection of trees, we can easily filter and decompress a subset of trees of interest (such as the set of unique trees), or build the resulting consensus tree in a matter of seconds. We also show the ease with which set operations can be performed on TRZ files, at speeds faster than on Newick or 7zip-compressed Newick files, and without loss of space savings. TreeZip is an efficient approach for compressing large collections of phylogenetic trees. The semantic and compact nature of the TRZ file allows it to be operated upon directly and quickly, without a need to decompress the original Newick file. We believe that TreeZip will be vital for compressing and archiving trees in the biological community.
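
    TreeZip's immunity to branch rotations comes from interpreting trees semantically, in terms of the groupings they induce, rather than as raw Newick text. The sketch below is a hypothetical clade extractor written for this listing, not TreeZip's actual bipartition encoding; it only shows why such a representation is rotation-invariant: two Newick strings that differ solely in the order of children map to the same set of clades.

        def _parse(s, i):
            """Minimal recursive-descent reader for an unweighted Newick subtree;
            returns (leaf set, list of clades, index just past the subtree)."""
            if s[i] == '(':
                i += 1
                leaves, clades = set(), []
                while True:
                    sub_leaves, sub_clades, i = _parse(s, i)
                    leaves |= sub_leaves
                    clades += sub_clades
                    if s[i] == ',':
                        i += 1
                    else:                 # ')'
                        i += 1
                        break
                clades.append(frozenset(leaves))
                return leaves, clades, i
            j = i
            while s[j] not in ',():;':
                j += 1
            return {s[i:j]}, [], j        # a single taxon label

        def clade_set(newick):
            """All clades of an unweighted Newick tree, as frozensets of taxa."""
            s = newick.replace(' ', '').rstrip(';')
            _, clades, _ = _parse(s, 0)
            return frozenset(clades)

        t1 = "((A,B),(C,D),E);"
        t2 = "(E,(D,C),(B,A));"                  # same tree after branch rotations
        print(clade_set(t1) == clade_set(t2))    # True: identical semantic content

    Roughly speaking, storing each distinct grouping once together with the trees that contain it is what lets the compressed TRZ file support fast consensus construction and set operations without decompressing back to Newick.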